3D-HEVC robust video watermarking algorithm based on depth map
CAO Haiyan, FENG Gui, HAN Xue, FANG Dingbang, HUANG Xinda
Journal of Computer Applications    2019, 39 (3): 869-873.   DOI: 10.11772/j.issn.1001-9081.2018081676

To address the insufficient robustness of the depth map in multi-view-plus-depth 3D video, a robust 3D video watermarking algorithm based on the depth map was proposed. Firstly, the depth map was divided into 4×4 non-overlapping blocks, the mean squared error of the pixels in each block was calculated, and a threshold was set to distinguish texture blocks from flat blocks. Secondly, the energy value of each texture block was calculated, and a second threshold was set on this value to selectively embed the watermark bits. Finally, the transformed and quantized DC coefficients of each block were used to construct a 3×3 invertible matrix, QR decomposition was performed on this matrix, and the watermark was embedded in the resulting Q matrix. The proposed algorithm keeps the average Peak Signal-to-Noise Ratio (PSNR) constant, and the average Bit Error Rate (BER) under re-encoding attacks with different Quantization Parameter (QP) values (25, 30, 35, 40) is 14.9%. Experimental results show that the algorithm has good robustness and embedding capacity, with little impact on video quality.
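The block classification and QR-based embedding described above can be sketched as follows. The 4×4 block size and 3×3 matrix match the abstract, but the MSE threshold, the quantization step `delta`, and the choice of Q entry are illustrative assumptions; the construction of the matrix from DC coefficients and the re-encoding robustness machinery are omitted.

```python
import numpy as np

def classify_blocks(depth, block=4, mse_thresh=20.0):
    """Flag 4x4 texture blocks by mean squared error from the block mean.
    The threshold 20.0 is a hypothetical tuning value, not the paper's."""
    h, w = depth.shape
    texture = []
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            blk = depth[i:i + block, j:j + block].astype(float)
            if np.mean((blk - blk.mean()) ** 2) > mse_thresh:
                texture.append((i, j))
    return texture

def embed_bit_qr(mat3, bit, delta=0.05):
    """Embed one bit by quantizing an entry of Q from the QR decomposition
    of a 3x3 coefficient matrix (illustrative rule; the paper's exact
    embedding position and strength are not reproduced here)."""
    Q, R = np.linalg.qr(mat3)
    k = int(np.floor(Q[1, 0] / delta))
    if k % 2 != bit:               # force the bin-index parity to carry the bit
        k += 1
    Qw = Q.copy()
    Qw[1, 0] = (k + 0.5) * delta   # move to the centre of the chosen bin
    return Qw, R

def extract_bit_qr(Qw, delta=0.05):
    """Read the bit back from the parity of the quantization bin."""
    return int(np.floor(Qw[1, 0] / delta)) % 2
```

In the full algorithm the watermarked matrix would be reassembled as `Qw @ R` and written back into the DC coefficients; that reconstruction step, and its effect on the orthogonality of Q, is skipped here for brevity.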

Feature selection based on maximum conditional and joint mutual information
MAO Yingchi, CAO Hai, PING Ping, LI Xiaofang
Journal of Computer Applications    2019, 39 (3): 734-741.   DOI: 10.11772/j.issn.1001-9081.2018081694
In the analysis of high-dimensional data such as image, genetic and text data, redundant features greatly increase the complexity of the problem, so removing them before analysis is important. Feature selection based on Mutual Information (MI) can reduce the data dimension and improve the accuracy of the analysis results, but existing methods cannot reasonably eliminate redundant features because they rely on a single criterion. To solve this problem, a feature selection method based on Maximum Conditional and Joint Mutual Information (MCJMI) was proposed. Both joint mutual information and conditional mutual information were considered when selecting features with MCJMI, improving the feature selection constraint. Experimental results show that the detection accuracy is improved by 6% compared with Information Gain (IG) and minimum Redundancy Maximum Relevance (mRMR) feature selection, by 2% compared with Joint Mutual Information (JMI) and Joint Mutual Information Maximisation (JMIM), and by 1% compared with the LW index with Sequential Forward Search (SFS-LW). The stability of MCJMI reaches 0.92, better than that of JMI, JMIM and SFS-LW. In summary, the proposed method effectively improves the accuracy and stability of feature selection.
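A greedy, mutual-information-based selection of the kind the abstract describes can be sketched for discrete features as below. The scoring rule used here (maximizing the minimum joint MI with the already-selected features) is a JMI-style stand-in, not the paper's exact MCJMI criterion, whose precise combination of conditional and joint MI the abstract does not spell out.

```python
import numpy as np

def mutual_info(x, y):
    """I(X;Y) in bits for discrete arrays, via the joint histogram."""
    _, xi = np.unique(x, return_inverse=True)
    _, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    for a, b in zip(xi, yi):
        joint[a, b] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

def joint_mi(xk, xs, y):
    """I(Xk, Xs; Y): treat the feature pair as one discrete variable."""
    pair = xk.astype(int) * (int(xs.max()) + 1) + xs.astype(int)
    return mutual_info(pair, y)

def greedy_select(X, y, k):
    """Pick the first feature by max I(X;Y), then repeatedly add the
    candidate maximizing its minimum joint MI with the selected set
    (an illustrative criterion used in place of the paper's MCJMI)."""
    n_feat = X.shape[1]
    selected = [max(range(n_feat), key=lambda j: mutual_info(X[:, j], y))]
    while len(selected) < k:
        rest = [j for j in range(n_feat) if j not in selected]
        best = max(rest, key=lambda j: min(joint_mi(X[:, j], X[:, s], y)
                                           for s in selected))
        selected.append(best)
    return selected
```

Considering a candidate's relation to every already-selected feature, rather than a single global score, is what lets criteria of this family penalize redundant features.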
Distributed particle filter algorithm with low complexity for cooperative blind equalization
WU Di, CAO Haifeng, GE Lindong, PENG Hua
Journal of Computer Applications    2014, 34 (6): 1546-1549.   DOI: 10.11772/j.issn.1001-9081.2014.06.1546

Traditional blind equalization with a single receiver is strongly affected by channel fading and suffers a high Bit Error Rate (BER). To improve BER performance, a low-complexity Distributed Particle Filter (DPF) algorithm for cooperative blind equalization in cooperative receiver networks was proposed. In the proposed algorithm, multiple receivers form a distributed network with no fusion center and estimate the transmitted sequence cooperatively using a distributed particle filter. To reduce the complexity of particle sampling, the prior probability was employed as the importance function. A minimum consensus algorithm was then used to approximate the global likelihood function across the receiver network, so that all nodes obtained the same set of particles and weights. Theoretical analysis and simulation results show that the proposed algorithm requires no fusion center for data centralization and reduces computational complexity. The fully distributed cooperative scheme achieves spatial diversity gain and improves BER performance.
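The fusion-free consensus step can be illustrated with a minimal min-consensus sketch. The network topology, iteration count, and the per-particle statistic being agreed on are assumptions here; the rest of the filter (prior-sampling importance function, weighting, resampling) is omitted.

```python
import numpy as np

def min_consensus(values, adjacency, iters):
    """Elementwise minimum consensus over a network with no fusion center.
    values: (n_nodes, n_particles) local per-particle statistics.
    Each iteration, every node replaces its vector with the minimum over
    itself and its neighbours; after roughly network-diameter iterations
    every node holds the same global minimum, so all nodes can agree on a
    shared per-particle quantity without centralizing their data."""
    v = np.asarray(values, dtype=float).copy()
    n = v.shape[0]
    for _ in range(iters):
        new = v.copy()
        for i in range(n):
            for j in range(n):
                if adjacency[i][j]:
                    new[i] = np.minimum(new[i], v[j])
        v = new
    return v
```

For a chain of three receivers (diameter 2), two iterations suffice for all three nodes to hold the elementwise global minimum, which is how every node can end up with an identical set of particles and weights.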
